The Morning After: iOS 17 offers better protection for unsolicited images
Receiving an unsolicited image is an unpleasant experience at the best of times, and one technology has made all too common. At WWDC, Apple announced that iOS 17 will use an on-device machine learning model to scan images and videos for nudity. When the system detects it, you'll get a pop-up warning that the file may be inappropriate. I wonder how much of this is a response to the practice of AirDropping explicit images to unsuspecting people's phones. In one notable incident from 2022, a passenger was removed from a flight after sharing an image of themselves with other passengers.